[ET][Kernels] Increase Half/Bfloat16 support #13646
Conversation
Add Half/Bfloat16 dtype support for the following ops:
- bmm.out
- max.dim_max
- min.dim_min
- scatter_add.out

Differential Revision: [D80963875](https://our.internmc.facebook.com/intern/diff/D80963875/)

[ghstack-poisoned]
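For context, a minimal sketch of how the functional counterparts of these out-variant kernels can be exercised at the newly supported dtypes. The shapes, values, and dtype loop below are illustrative only (not taken from the PR's tests), and whether eager CPU PyTorch runs every op at Half may depend on the build; the PR itself targets the ExecuTorch kernels rather than eager mode.

```python
import torch

# Illustrative only: iterate over the two dtypes this PR adds support for.
for dtype in (torch.half, torch.bfloat16):
    # bmm.out: batched matrix multiply into a preallocated output tensor.
    a = torch.rand(2, 3, 4, dtype=dtype)
    b = torch.rand(2, 4, 5, dtype=dtype)
    out = torch.empty(2, 3, 5, dtype=dtype)
    torch.bmm(a, b, out=out)

    # max.dim_max / min.dim_min: reductions along a dimension that
    # return a (values, indices) pair.
    x = torch.rand(3, 4, dtype=dtype)
    max_vals, max_idx = torch.max(x, dim=1)
    min_vals, min_idx = torch.min(x, dim=1)

    # scatter_add.out: accumulate src values into the input at the
    # positions given by index along dim 1.
    base = torch.zeros(3, 5, dtype=dtype)
    src = torch.rand(3, 5, dtype=dtype)
    index = torch.randint(0, 5, (3, 5))
    result = torch.scatter_add(base, 1, index, src)
```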
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/13646
Note: Links to docs will display an error until the docs builds have been completed.
❌ 1 New Failure as of commit 40e650a with merge base ed8cbc0.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D80963875
Pull Request resolved: #13646

Add Half/Bfloat16 dtype support for the following ops:
- bmm.out
- max.dim_max
- min.dim_min
- scatter_add.out

ghstack-source-id: 305497634
@exported-using-ghexport

Differential Revision: [D80963875](https://our.internmc.facebook.com/intern/diff/D80963875/)
Merged commit 9aee622 into gh/manuelcandales/133/base